[[File:Bernoulli_trial_progression.svg|thumb|400px|Graphs of probability P of not observing independent events each of probability p after n Bernoulli trials vs np for various p. Three examples are shown:
Blue curve: Throwing a 6-sided die 6 times gives a 33.5% chance that a 6 (or any other given number) never turns up; as n increases, the probability of a 1/n-chance event never appearing after n tries rapidly converges to 1/e.
Grey curve: To get a 50-50 chance of throwing a Yahtzee (5 cubic dice all showing the same number) requires 0.69 × 1296 ≈ 898 throws.
Green curve: Drawing a card from a deck of playing cards without jokers 100 (1.92 × 52) times with replacement gives 85.7% chance of drawing the ace of spades at least once.]]
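The figures quoted in the caption follow from the complement rule; a minimal Python sketch of the arithmetic (the variable names are illustrative, not from the article):
<syntaxhighlight lang="python">
import math

# Blue curve: a given face never appears in 6 rolls of a fair die.
p_never = (1 - 1/6) ** 6                                  # ~0.335, close to 1/e ~ 0.368

# Grey curve: throws needed for a 50% chance of a Yahtzee; any Yahtzee on
# a single throw of 5 dice has probability 6/6**5 = 1/1296.
throws_for_half = math.log(0.5) / math.log(1 - 1/1296)    # ~898

# Green curve: ace of spades drawn at least once in 100 draws with replacement.
p_ace = 1 - (51/52) ** 100                                # ~0.857

print(p_never, throws_for_half, p_ace)
</syntaxhighlight>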
In the theory of probability and statistics, a Bernoulli trial (or binomial trial) is a random experiment with exactly two possible outcomes, "success" and "failure", in which the probability of success is the same every time the experiment is conducted. It is named after Jacob Bernoulli, a 17th-century Swiss mathematician, who analyzed them in his ''Ars Conjectandi'' (1713).<ref>James Victor Uspensky: ''Introduction to Mathematical Probability'', McGraw-Hill, New York 1937, p. 45.</ref>
The mathematical formalization and advanced formulation of the Bernoulli trial is known as the Bernoulli process.
Since a Bernoulli trial has only two possible outcomes, it can be framed as a "yes or no" question. For example:
*Is the top card of a shuffled deck an ace?
*Was the newborn child a girl?
Success and failure are in this context labels for the two outcomes, and should not be construed literally or as value judgments. More generally, given any probability space, for any event (set of outcomes), one can define a Bernoulli trial according to whether the event occurred or not (event or complementary event). Examples of Bernoulli trials include:
*Flipping a coin, where "heads" conventionally denotes success and "tails" denotes failure.
*Rolling a die, where a six is "success" and everything else is "failure".
*In conducting a political opinion poll, choosing a voter at random to ascertain whether that voter will vote "yes" in an upcoming referendum.
Alternatively, these can be stated in terms of odds: given probability $p$ of success and $q$ of failure, the odds for are $p:q$ and the odds against are $q:p$. These can also be expressed as numbers, by dividing, yielding the odds for, $o_f$, and the odds against, $o_a$:
\begin{align}
o_f &= \frac{p}{q} = \frac{p}{1-p} = \frac{1-q}{q} \\
o_a &= \frac{q}{p} = \frac{1-p}{p} = \frac{q}{1-q}.
\end{align}
In the case that a Bernoulli trial is representing an event from finitely many equally likely outcomes, where $S$ of the outcomes are success and $F$ of the outcomes are failure, the odds for are $S:F$ and the odds against are $F:S$. This yields the following formulas for probability and odds:
\begin{align}
p &= \frac{S}{S+F}, & q &= \frac{F}{S+F}, & o_f &= \frac{S}{F}, & o_a &= \frac{F}{S}.
\end{align}
Here the odds are computed by dividing the numbers of outcomes rather than the probabilities, but the proportion is the same, since the two ratios differ only by a common factor.
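For example, if the event is rolling a six with a fair die, then $S = 1$ and $F = 5$, so
\begin{align}
p &= \tfrac{1}{6}, & q &= \tfrac{5}{6}, & o_f &= \tfrac{1}{5}, & o_a &= 5.
\end{align}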
Random variables describing Bernoulli trials are often encoded using the convention that 1 = "success", 0 = "failure".
Closely related to a Bernoulli trial is a binomial experiment, which consists of a fixed number $n$ of statistically independent Bernoulli trials, each with a probability of success $p$, and counts the number of successes. A random variable corresponding to a binomial experiment is denoted by $B(n,p)$, and is said to have a binomial distribution. The probability of exactly $k$ successes in the experiment $B(n,p)$ is given by:
\begin{align}
P(k) &= {n \choose k} p^{k} q^{n-k},
\end{align}
where ${n \choose k}$ is a binomial coefficient.
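This formula can be evaluated directly; a minimal Python sketch (the function name is illustrative) that also reproduces the worked examples below:
<syntaxhighlight lang="python">
from math import comb

def binomial_pmf(k: int, n: int, p: float) -> float:
    """Probability of exactly k successes in n independent Bernoulli trials."""
    q = 1 - p
    return comb(n, k) * p**k * q**(n - k)

print(binomial_pmf(2, 4, 1/2))   # 0.375  = 3/8  (two heads in four coin tosses)
print(binomial_pmf(2, 3, 1/6))   # ~0.069 = 5/72 (two sixes in three die rolls)
</syntaxhighlight>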
Bernoulli trials may also lead to negative binomial distributions (which count the number of successes in a series of repeated Bernoulli trials until a specified number of failures are seen), as well as various other distributions.
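As a small illustration of that counting scheme, a simulation sketch (the function name is illustrative, not a standard API):
<syntaxhighlight lang="python">
import random

def negative_binomial_sample(r: int, p: float) -> int:
    """Count successes observed before the r-th failure in repeated
    Bernoulli trials with success probability p."""
    successes = failures = 0
    while failures < r:
        if random.random() < p:   # one Bernoulli trial
            successes += 1
        else:
            failures += 1
    return successes
</syntaxhighlight>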
When multiple Bernoulli trials are performed, each with its own probability of success, these are sometimes referred to as Poisson trials.<ref>Rajeev Motwani and P. Raghavan. ''Randomized Algorithms''. Cambridge University Press, New York (NY), 1995, pp. 67–68.</ref>
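A minimal sketch of such trials, each with its own success probability (the function name is illustrative):
<syntaxhighlight lang="python">
import random

def count_successes(probs: list[float]) -> int:
    """Number of successes in independent trials whose success
    probabilities may differ from trial to trial."""
    return sum(random.random() < p for p in probs)

# Three trials with success probabilities 0.9, 0.5 and 0.1.
print(count_successes([0.9, 0.5, 0.1]))
</syntaxhighlight>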
Consider the simple experiment in which a fair coin is tossed four times, with heads defined as success and tails as failure, so that $p = q = \tfrac{1}{2}$. Using the equation above, the probability of exactly two of the four tosses resulting in heads is given by:
\begin{align}
P(2) &= {4 \choose 2} p^{2} q^{4-2} \\
&= 6 \times \left(\tfrac{1}{2}\right)^2 \times \left(\tfrac{1}{2}\right)^2 \\
&= \dfrac{3}{8}.
\end{align}
As above, the probability of exactly two sixes when three fair six-sided dice are rolled, with rolling a six counted as success ($p = \tfrac{1}{6}$, $q = \tfrac{5}{6}$), is:
\begin{align}
P(2) &= {3 \choose 2} p^{2} q^{3-2} \\
&= 3 \times \left(\tfrac{1}{6}\right)^2 \times \left(\tfrac{5}{6}\right)^1 \\
&= \dfrac{5}{72} \approx 0.069.
\end{align}